Effect of nonlinear transformations on correlation between weighted sums in multilayer perceptrons

Authors

  • Sang-Hoon Oh
  • Youngjik Lee
Abstract

Nonlinear transformation is one of the major obstacles to analyzing the properties of multilayer perceptrons. In this letter, we prove that the correlation coefficient between two jointly Gaussian random variables decreases when each of them is transformed under continuous nonlinear transformations, which can be approximated by piecewise linear functions. When the inputs or the weights of a multilayer perceptron are perturbed randomly, the weighted sums to the hidden neurons are asymptotically jointly Gaussian random variables. Since sigmoidal transformation can be approximated piecewise linearly, the correlations among the weighted sums decrease under sigmoidal transformations. Based on this result, we can say that sigmoidal transformation used as the transfer function of the multilayer perceptron reduces redundancy in the information contents of the hidden neurons.
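The abstract's central claim can be checked numerically: for jointly Gaussian variables with correlation coefficient rho, applying a sigmoidal transformation such as tanh to each variable yields a correlation smaller in magnitude than rho. The following minimal sketch (not part of the original letter; the choice of rho = 0.8 and of tanh as the sigmoid is illustrative) estimates both correlations from samples:

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8          # correlation of the jointly Gaussian pair
n = 200_000        # sample size, large enough to make sampling noise negligible

# Draw n samples of a zero-mean bivariate Gaussian with correlation rho.
cov = [[1.0, rho], [rho, 1.0]]
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

# Sample correlation before and after the sigmoidal (tanh) transformation.
rho_before = np.corrcoef(x, y)[0, 1]
rho_after = np.corrcoef(np.tanh(x), np.tanh(y))[0, 1]

print(f"before: {rho_before:.4f}  after: {rho_after:.4f}")
```

The sample estimate `rho_after` comes out strictly smaller than `rho_before`, consistent with the letter's result that sigmoidal transformations reduce the correlation among the weighted sums to the hidden neurons.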


Related articles

Acoustic adaptation using nonlinear transformations of HMM parameters

Speech recognition performance degrades significantly when there is a mismatch between testing and training conditions. Linear transformation-based maximum-likelihood (ML) techniques have been proposed recently to tackle this problem. In this paper, we extend this approach to use nonlinear transformations. These are implemented by multilayer perceptrons (MLPs) which transform the Gaussian means...


Efficient training of multilayer perceptrons using principal component analysis.

A training algorithm for multilayer perceptrons is discussed and studied in detail, which relates to the technique of principal component analysis. The latter is performed with respect to a correlation matrix computed from the example inputs and their target outputs. Typical properties of the training procedure are investigated by means of a statistical physics analysis in models of learning re...


Hyperparameter Optimization with Factorized Multilayer Perceptrons

In machine learning, hyperparameter optimization is a challenging task that is usually approached by experienced practitioners or in a computationally expensive brute-force manner such as grid-search. Therefore, recent research proposes to use observed hyperparameter performance on already solved problems (i.e. data sets) in order to speed up the search for promising hyperparameter configuratio...


Multi-Layer Perceptrons and Symbolic Data

In some real-world situations, linear models are not sufficient to represent accurately complex relations between input variables and output variables of a studied system. Multilayer perceptrons are one of the most successful non-linear regression tools, but they are unfortunately restricted to inputs and outputs that belong to a normed vector space. In this chapter, we propose a general recoding...


Strong Laws for Weighted Sums of Negative Dependent Random Variables

In this paper, we discuss strong laws for weighted sums of pairwise negatively dependent random variables. The results for the i.i.d. case of Soo Hak Sung [9] are generalized and extended.



Journal:
  • IEEE Transactions on Neural Networks

Volume 5, Issue 3

Pages: -

Published: 1994